Dew Math for .NET
Optimization.ConjGrad Method (TRealFunction, TGrad, double[], [In] double[], [In] object[], out double, out TOptStopReason, [In] TMtxFloatPrecision, bool, bool, int, double, double, [In] TStrings)

Minimizes a function of several variables by using the Conjugate gradient optimization algorithm.

Syntax
C#
public static int ConjGrad(TRealFunction Fun, TGrad Grad, ref double[] Pars, [In] double[] Consts, [In] object[] ObjConst, out double FMin, out TOptStopReason StopReason, [In] TMtxFloatPrecision FloatPrecision, bool FletcherAlgo, bool SoftLineSearch, int MaxIter, double Tol, double GradTol, [In] TStrings Verbose);
Parameters 
TRealFunction Fun 
Real function (must be of TRealFunction type) to be minimized. 
TGrad Grad 
The gradient and Hessian procedure (must be of TGrad type); ConjGrad uses it only to calculate the gradient. 
ref double[] Pars 
Stores the initial estimates for the parameters (the starting point of the search). After the call returns, holds the adjusted values (the position of the minimum). 
[In] double[] Consts 
Additional Fun constant parameters (can be, and usually is, null). 
[In] object[] ObjConst 
Additional Fun constant parameters (can be, and usually is, null). 
out double FMin 
Returns the function value at the minimum. 
out TOptStopReason StopReason 
Returns the reason why the minimum search stopped (see TOptStopReason). 
[In] TMtxFloatPrecision FloatPrecision 
Specifies the floating point precision to be used by the routine. 
bool FletcherAlgo 
If true, the ConjGrad procedure will use the Fletcher-Reeves method. If false, it will use the Polak-Ribiere method (see the update formulas after this parameter list). 
bool SoftLineSearch 
If true, ConjGrad's internal line search algorithm will use the soft line search method; set SoftLineSearch to true if you're using a numerical approximation for the gradient. If SoftLineSearch is false, the internal line search algorithm will use the exact line search method; set SoftLineSearch to false if you're using the *exact* gradient. 
int MaxIter 
Maximum allowed number of minimum search iterations. 
double Tol 
Desired tolerance for Pars, the position of the minimum. 
double GradTol 
Minimum allowed gradient C-norm (the largest absolute component of the gradient). 
[In] TStrings Verbose 
If assigned, stores the value of Fun evaluated at each iteration step. Optionally, you can also pass a TOptControl object as the Verbose parameter. This allows the optimization procedure to be interrupted from another thread, and optionally also allows logging and iteration count monitoring. 
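
As background on the FletcherAlgo parameter, the two variants differ only in how the conjugacy coefficient beta is computed. The formulas below are the standard textbook formulations, given for reference and not as a statement about this library's exact internals. Both variants update the search direction as d(k+1) = -g(k+1) + beta(k)*d(k), starting from d(0) = -g(0), where g(k) denotes the gradient at iteration k:

beta(k), Fletcher-Reeves: (g(k+1)' * g(k+1)) / (g(k)' * g(k))
beta(k), Polak-Ribiere: (g(k+1)' * (g(k+1) - g(k))) / (g(k)' * g(k))

(' denotes transpose, so each expression is a ratio of dot products.)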

Return Value

The number of iterations required to reach the solution (the minimum) within the given tolerance.

Problem: Find the minimum of the "Banana" function by using the Conjugate gradient method. 

Solution: The Banana (Rosenbrock) function is defined by the following equation:

f(x1, x2) = 100*(x2 - x1^2)^2 + (1 - x1)^2
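
For reference, the analytic gradient follows directly from the equation above:

df/dx1 = -400*x1*(x2 - x1^2) - 2*(1 - x1)
df/dx2 = 200*(x2 - x1^2)

Both components vanish at (1, 1), so the minimum lies at x1 = 1, x2 = 1 with f = 0.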

Normally the ConjGrad method would also require a gradient procedure, but in this example we'll use a numerical approximation instead, more precisely the MtxIntDiff.NumericGradRichardson routine. This is done by specifying the NumericGradRichardson routine as the Grad parameter in the ConjGrad call (see below). 

 

// Objective function: the Banana (Rosenbrock) function
private double Banana(TVec x, TVec c, params object[] o)
{
    return 100 * Math387.IntPower(x[1] - Math387.IntPower(x[0], 2), 2)
               + Math387.IntPower(1 - x[0], 2);
}

private void Example()
{
    double[] Pars = new double[2];
    double fmin;
    TOptStopReason StopReason;

    // initial estimates for x1 and x2
    Pars[0] = 0;
    Pars[1] = 0;

    // stop if iterations exceed 1000 or tolerance drops below 1e-8
    int iters = Optimization.ConjGrad(Banana, MtxIntDiff.NumericGradRichardson,
        ref Pars, null, null, out fmin, out StopReason,
        TMtxFloatPrecision.mvDouble, true, false, 1000, 1.0e-8, 1.0e-8, null);
}
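
As a follow-up, here is a minimal sketch of how the results could be inspected after the ConjGrad call returns (placed inside Example(); the Console calls assume a console application and are not part of the original example). With the settings above, the search is expected to end near (1, 1) with fmin close to 0:

// Hypothetical result inspection; assumes a console application
Console.WriteLine("Iterations: " + iters);
Console.WriteLine("Minimum value: " + fmin);  // expected close to 0
Console.WriteLine("Minimum position: (" + Pars[0] + ", " + Pars[1] + ")");  // expected near (1, 1)
Console.WriteLine("Stop reason: " + StopReason);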
Copyright (c) 1999-2024 by Dew Research. All rights reserved.